
    Coordinating attention requires coordinated senses

    From playing basketball to ordering at a food counter, we frequently and effortlessly coordinate our attention with others towards a common focus: we look at the ball, or point at a piece of cake. This non-verbal coordination of attention plays a fundamental role in our social lives: it ensures that we refer to the same object, develop a shared language, understand each other’s mental states, and coordinate our actions. Models of joint attention generally attribute this accomplishment to gaze coordination. But are visual attentional mechanisms sufficient to achieve joint attention in all cases? Beyond cases where visual information is missing, we show how combining vision with other senses can be helpful, and even necessary, for certain uses of joint attention. We explain the two ways in which non-visual cues contribute to joint attention: either as enhancers, when they complement gaze and pointing gestures in order to coordinate joint attention on visible objects, or as modality pointers, when joint attention needs to be shifted away from the whole object to one of its properties, say weight or texture. This multisensory approach to joint attention has important implications for social robotics, clinical diagnostics, pedagogy and theoretical debates on the construction of a shared world.

    Confidence is higher in touch than in vision in cases of perceptual ambiguity

    The inclination to touch objects that we can see is a surprising behaviour, given that vision often supplies relevant and sufficiently accurate sensory evidence. Here we suggest that this 'fact-checking' phenomenon could be explained if touch provides a higher level of perceptual certainty than vision. To test this hypothesis, observers explored inverted T-shaped stimuli eliciting the vertical-horizontal illusion in vision and touch, including both clear-cut and ambiguous cases. In separate blocks, observers judged whether the vertical bar was shorter or longer than the horizontal bar and rated their confidence in these judgments. Decisions reached by vision were objectively more accurate than those reached by touch and were accompanied by higher overall confidence ratings. However, while confidence was higher for vision than for touch in clear-cut cases, observers were more confident in touch when the stimuli were ambiguous. This relative bias as a function of ambiguity qualifies the view that confidence tracks objective accuracy and uses a comparable mapping across sensory modalities. By employing a perceptual illusion, our method disentangles objective and subjective accuracy, showing how the latter is tracked by confidence, and points towards possible origins of 'fact-checking' by touch.

    Voice over: Audio-visual congruency and content recall in the gallery setting

    Experimental research has shown that pairs of stimuli which are congruent and assumed to 'go together' are recalled more effectively than items presented in isolation. Will this multisensory memory benefit occur when stimuli are richer and longer, in an ecological setting? In the present study, we focused on an everyday situation of audio-visual learning and manipulated the relationship between audio guide tracks and viewed portraits in the galleries of Tate Britain. By varying the gender and narrative style of the voice-over, we examined how the perceived congruency and assumed unity of the audio guide track with painted portraits affected subsequent recall. We show that tracks perceived as best matching the viewed portraits led to greater recall of both sensory and linguistic content. We provide the first evidence that manipulating crossmodal congruence and unity assumptions can effectively impact memory in a multisensory ecological setting, even in the absence of precise temporal alignment between sensory cues.

    Contingent sounds change the mental representation of one’s finger length

    Mental body-representations are highly plastic and can be modified after brief exposure to unexpected sensory feedback. While the role of vision, touch and proprioception in shaping body-representations has been highlighted by many studies, auditory influences on mental body-representations remain poorly understood. Changes in body-representations induced by manipulating the natural sounds produced when one’s body impacts surfaces have recently been demonstrated. But will these changes also occur with non-naturalistic sounds, which provide no information about the impact produced by or on the body? Drawing on the well-documented capacity of dynamic changes in pitch to elicit impressions of motion along the vertical plane and of changes in object size, we asked participants to pull on their right index fingertip with their left hand while they were presented with brief sounds of rising, falling or constant pitch, in the absence of visual information about their hands. Results show an “auditory Pinocchio” effect, with participants feeling and estimating their finger to be longer after the rising-pitch condition. These results provide the first evidence that sounds that are not indicative of veridical movement, such as non-naturalistic sounds, can induce a Pinocchio-like change in body-representation when arbitrarily paired with a bodily action.

    Reciprocity and alignment: quantifying coupling in dynamic interactions

    Recent accounts of social cognition focus on how we do things together, suggesting that becoming aligned relies on a reciprocal exchange of information. The next step is to develop richer computational methods that quantify the degree of coupling and describe the nature of the information exchange. We put forward a definition of coupling, compare it to related terminology, and detail the available computational methods and the levels of organization to which they pertain, presenting them as a hierarchy from the weakest to the richest forms of coupling. The rationale is that a temporally coherent link between two dynamical systems at the lowest level of organization sustains mutual adaptation and alignment at the highest level. We postulate that when we do things together, we do so dynamically over time, and argue that determining and measuring instances of true reciprocity in social exchanges is key. Along with this computationally rich definition of coupling, we present challenges for the field to be tackled by a diverse community working towards a dynamic account of social cognition.
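    The weakest rung of such a hierarchy, a temporally coherent link between two signals, can be illustrated with a simple lagged cross-correlation. This is a minimal sketch, not a method from the paper; the function name and all parameters are our own assumptions:

```python
import numpy as np

def max_lagged_correlation(x, y, max_lag):
    """Cross-correlate two z-scored time series over a range of lags and
    return the lag (in samples) with the strongest Pearson correlation.
    A reliable peak at some lag is evidence for the weakest form of
    coupling: a temporally coherent link between the two signals.
    A negative best lag means the second series trails the first."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    best_lag, best_r = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            r = np.corrcoef(x[:lag], y[-lag:])[0, 1]
        elif lag > 0:
            r = np.corrcoef(x[lag:], y[:-lag])[0, 1]
        else:
            r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r
```

    A symmetric measure like this can establish temporal coherence but not reciprocity; the richer levels of the hierarchy would need directed, model-based methods that track how each system adapts to the other.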

    Sensing the body through sound

    We acknowledge funding by the Spanish Agencia Estatal de Investigación (PID2019-105579RB-I00 / AEI/10.13039/501100011033) and the European Research Council (ERCCoG-101002711) to AT. OD is funded by the NOMIS foundation (DISE grant).

    The importance of integration and top-down salience when listening to complex multi-part musical stimuli

    In listening to multi-part music, auditory streams can be attended to either selectively or globally. More specifically, musicians rely on prioritized integrative attention, which incorporates both stream segregation and integration to assess the relationship between concurrent parts. In this fMRI study, we used a piano duet to investigate which factors of a leader-follower relationship between parts grab the listener's attention and influence the perception of multi-part music. The factors considered included the structural relationship between melody and accompaniment as well as the temporal relationship (asynchronies) between parts. The structural relationship was manipulated by cueing subjects to the part of the duet that had to be prioritized. The temporal relationship was investigated by synthetically shifting the onset times of melody and accompaniment to produce either a consistent melody lead or a consistent accompaniment lead. Of interest was the relative importance of these relationship factors for segregation and integration as attentional mechanisms. Participants were required to listen to the cued part and then globally assess whether the prioritized stream was leading or following relative to the second stream. Results show that the melody is judged as more leading when it is globally temporally ahead, whereas the accompaniment is not judged as leading when it is ahead. This bias may result from the interaction of the salience of both leader-follower relationship factors. Interestingly, the corresponding interaction effect in the fMRI data yields an inverse bias for melody in a fronto-parietal attention network. Corresponding parameter estimates within the dlPFC and right IPS show higher neural activity for attending to melody when listening to a performance without a temporal leader, pointing to an interaction of the salience of both factors in listening to music. Both frontal and parietal activation implicate segregation and integration mechanisms and a top-down influence of salience on attention and the perception of leader-follower relations in music.

    Segregation and integration of auditory streams when listening to multi-part music

    In our daily lives, auditory stream segregation allows us to differentiate concurrent sound sources and to make sense of the scene we are experiencing. However, a combination of segregation and the concurrent integration of auditory streams is necessary in order to analyze the relationship between streams and thus perceive a coherent auditory scene. The present functional magnetic resonance imaging study investigates the relative roles and neural underpinnings of these listening strategies in multi-part musical stimuli. We compare a real human performance of a piano duet and a synthetic stimulus of the same duet in a prioritized integrative attention paradigm that required the simultaneous segregation and integration of auditory streams. In so doing, we manipulate the degree to which the attended part of the duet led either structurally (attend melody vs. attend accompaniment) or temporally (asynchronies vs. no asynchronies between parts), and thus the relative contributions of integration and segregation used to assess the leader-follower relationship. We show that, perceptually, the relationship between parts is biased towards the conventional structural hierarchy of Western music, in which the melody generally dominates (leads) the accompaniment. Moreover, the assessment varies as a function of cognitive load, as shown through difficulty ratings and the interaction of the temporal and structural relationship factors. Neurally, the temporal relationship between parts, one important cue for stream segregation, revealed distinct activity in the planum temporale. By contrast, the integration used when listening to both the temporally separated performance stimulus and the temporally fused synthetic stimulus resulted in activation of the intraparietal sulcus. These results support the hypothesis that the planum temporale and the IPS are key structures underlying the mechanisms of segregation and integration of auditory streams, respectively.

    Leading the follower: an fMRI investigation of dynamic cooperativity and leader-follower strategies in synchronization with an adaptive virtual partner

    From everyday experience we know that it is generally easier to interact with someone who adapts to our behavior. Beyond this, achieving a common goal will very much depend on who adapts to whom and to what degree. Therefore, many joint action tasks, such as musical performance, prove more successful when defined leader-follower roles are established. In the present study, we introduce a novel approach to exploring how individuals lead and, using functional magnetic resonance imaging (fMRI), probe the neural correlates of leading. Specifically, we implemented an adaptive virtual partner (VP), an auditory pacing signal, with which individuals were instructed to tap in synchrony while maintaining a steady tempo. By varying the degree of temporal adaptation (period correction) implemented by the VP, we manipulated the objective control individuals had to exert to maintain the overall tempo of the pacing sequence (which was prone to tempo drift at high levels of period correction). Our imaging data revealed that perceiving greater influence and leading are correlated with right-lateralized frontal activation of areas involved in cognitive control and self-related processing. Using participants' subjective ratings of influence and task difficulty, we classified a subgroup of our cohort as "leaders": individuals who found the task of synchronizing easier when they felt more in control. Behavioral tapping measures showed that leaders employed less error correction and focused more on self-tapping (prioritizing the instruction to maintain the given tempo) than on the stability of the interaction (prioritizing the instruction to synchronize with the VP), with correlated activity in areas involved in self-initiated action, including the pre-supplementary motor area.
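    The period-correction mechanism at the heart of this design can be sketched in a toy simulation. This is an illustrative model under our own assumptions, not the study's implementation: the update rule, function name and parameter values are all hypothetical.

```python
import numpy as np

def simulate_duet(alpha, n_taps=200, base_period=0.5, noise_sd=0.01, seed=1):
    """Simulate tapping with an adaptive virtual partner (VP).
    The VP shifts its inter-onset period by a fraction `alpha` of the
    last asynchrony (period correction); the human keeps the base tempo
    but taps with Gaussian motor noise. Returns the VP's period series."""
    rng = np.random.default_rng(seed)
    t_human = 0.0              # time of the last human tap
    t_vp = 0.0                 # time of the last VP tone
    period_vp = base_period
    periods = []
    for _ in range(n_taps):
        # human taps at the base tempo, with motor noise
        t_human += base_period + rng.normal(0.0, noise_sd)
        # VP schedules its next tone using its current period
        t_vp += period_vp
        asynchrony = t_vp - t_human        # positive: VP is late
        # period correction: shorten the period when late, lengthen when early
        period_vp -= alpha * asynchrony
        periods.append(period_vp)
    return np.array(periods)
```

    With alpha = 0 the VP is a fixed metronome and its period never moves; as alpha grows, every noisy asynchrony feeds back into the period, so the pacing sequence wanders over time — consistent with the abstract's note that the sequence was prone to tempo drift at high levels of period correction.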